GrabURL.readme | Text File | 1996-12-16 | 1KB | 34 lines
Short: Utility to recursively download HTTP files.
Author: emonds@ere.umontreal.ca (Serge Emond)
Uploader: emonds@ere.umontreal.ca (Serge Emond)
Type: comm/www
Requires at least OS 3.0, rexxdossupport.library V2 and AmiTCP 4.0+.
Contains 4 things:
1) GrabHTTP. Grabs files using the HTTP protocol. (ARexx host)
2) UrlManager. Keeps a list of URLs. (ARexx host)
3) ParseHTML. Scans an HTML file and creates a list of the URLs
   contained in it.
4) GrabURL.rexx.
GrabURL.rexx uses the three other programs to grab one or more URLs using
the HTTP protocol. It can collect URLs recursively, filtering them with
AmigaDOS patterns; grab a file only if it does not already exist on disk,
or only if it has been modified since the last grab (when the remote
server supports this); and so on. It can also stop after grabbing a
maximum number of bytes or for a maximum amount of time, and it can keep a
workfile that lets you continue exactly where the grabbing stopped.
GrabURL also has many other features.
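The grab/parse/recurse scheme described above can be sketched roughly as
follows. This is an illustration in Python, not the original ARexx; the
names (`grab_recursive`, the injectable `fetch` callable standing in for
GrabHTTP) are hypothetical, and an AmigaDOS pattern is approximated with a
shell glob:

```python
# Sketch of the GrabURL pipeline: grab a page, parse out its URLs,
# filter them by pattern, and recurse, skipping URLs already grabbed.
import fnmatch
import re
from urllib.parse import urljoin

HREF_RE = re.compile(r'href="([^"]+)"', re.IGNORECASE)

def parse_html(base_url, html):
    """ParseHTML's role: return the URLs contained in an HTML page."""
    return [urljoin(base_url, href) for href in HREF_RE.findall(html)]

def grab_recursive(start_url, fetch, pattern="*", max_bytes=None):
    """Grab start_url and, recursively, every linked URL matching
    `pattern`.  `fetch(url)` stands in for GrabHTTP and must return
    the page body as a string.  Stops once `max_bytes` are grabbed."""
    seen = set()              # UrlManager's role: the list of known URLs
    queue = [start_url]
    grabbed, total = {}, 0
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue          # already grabbed, skip it
        seen.add(url)
        body = fetch(url)
        grabbed[url] = body
        total += len(body)
        if max_bytes is not None and total >= max_bytes:
            break             # byte limit reached, stop grabbing
        # follow only links that match the pattern (glob approximation
        # of an AmigaDOS pattern)
        queue.extend(u for u in parse_html(url, body)
                     if fnmatch.fnmatch(u, pattern))
    return grabbed
```

A real version would also persist `seen` and `queue` to a workfile (so an
interrupted run can resume) and send If-Modified-Since headers, which this
sketch omits.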
GrabHTTP also has a small GUI (which can be turned off).
This is freeware. The source is ugly and not really optimized (i.e. I'm
sure it uses more memory than necessary), but it works better or faster
than any of the other utilities of the same kind I have seen on Aminet.
I'm planning a complete rewrite (with full internal multitasking and
multiple simultaneous grabs from a single running GrabURL), but I don't
have time for that now.
Comments are welcome.